# Multi-domain Pre-training
## GECKO 7B (kifai)

GECKO is a 7-billion-parameter decoder-only Transformer model trained on Korean, English, and code, released under the Apache 2.0 license.

- **Task:** Large Language Model (text generation)
- **Library:** Transformers
- **Languages:** Multilingual (Korean, English)
- **License:** Apache-2.0
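
A minimal sketch of generating text with this model via the Transformers library. The Hub repository id `kifai/GECKO-7B` is an assumption inferred from the listing, and the prompt is only an illustration:

```python
# Minimal sketch: load GECKO 7B for text generation with Transformers.
# The repository id "kifai/GECKO-7B" is assumed from the listing above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kifai/GECKO-7B"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the 7B weights on one GPU
    device_map="auto",          # requires the `accelerate` package
)

prompt = "대한민국의 수도는"  # Korean: "The capital of South Korea is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```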
## Wav2Vec2-Large-Robust-Ft-SWBD-300h (facebook)

This model is a fine-tuned version of Facebook's Wav2Vec2-Large-Robust, optimized for telephone speech recognition by fine-tuning on 300 hours of the Switchboard telephone speech corpus.

- **Task:** Speech Recognition
- **Library:** Transformers
- **Language:** English
- **License:** Apache-2.0
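
A minimal sketch of transcribing audio with this checkpoint using the standard Transformers CTC interface and greedy decoding. The audio file path is a placeholder, and the example assumes 16 kHz mono input, which wav2vec 2.0 models expect:

```python
# Minimal sketch: transcribe telephone-style speech with the
# facebook/wav2vec2-large-robust-ft-swbd-300h checkpoint.
import torch
import soundfile as sf
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "facebook/wav2vec2-large-robust-ft-swbd-300h"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Placeholder path; the model expects 16 kHz mono audio.
speech, sample_rate = sf.read("call_recording.wav")
inputs = processor(speech, sampling_rate=sample_rate, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token at each frame,
# then collapse repeats and blanks in batch_decode.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```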